Multi-class boosting with asymmetric binary weak-learners

Authors

  • Antonio Fernández-Baldera
  • Luis Baumela
Abstract

We introduce a multi-class generalization of AdaBoost with binary weak-learners. We use a vectorial codification to represent class labels and a multi-class exponential loss function to evaluate classifier responses. This representation produces a set of margin values that provide a range of punishments for failures and rewards for successes. Moreover, the stage-wise optimization of this model introduces an asymmetric boosting procedure whose costs depend on the number of classes separated by each weak-learner. In this way, the boosting procedure takes into account class imbalances when building the ensemble. The experiments performed compare this new approach favorably to AdaBoost.MH, GentleBoost and the SAMME algorithms.
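The vectorial codification and multi-class exponential loss the abstract refers to can be illustrated with a minimal sketch. This is not the authors' exact formulation; it assumes the SAMME-style coding (+1 in the true-class coordinate, −1/(K−1) elsewhere) and the loss exp(−(1/K) · y·f), which are the standard choices in this family of methods:

```python
import numpy as np

def vector_code(label, n_classes):
    """Vectorial codification of a class label: +1 in the true-class
    coordinate and -1/(K-1) in every other coordinate, so the code
    sums to zero (SAMME-style coding; the paper's coding may differ)."""
    y = np.full(n_classes, -1.0 / (n_classes - 1))
    y[label] = 1.0
    return y

def multiclass_exp_loss(y_vec, f_vec, n_classes):
    """Multi-class exponential loss exp(-(1/K) * y . f).

    The inner product y . f acts as a multi-class margin: it is
    positive when the ensemble response f agrees with the true class
    (loss < 1) and negative when it does not (loss > 1), yielding the
    graded punishments/rewards mentioned in the abstract."""
    return np.exp(-np.dot(y_vec, f_vec) / n_classes)

# Example with K = 3 classes, true class 0:
y = vector_code(0, 3)
loss_correct = multiclass_exp_loss(y, vector_code(0, 3), 3)  # margin > 0
loss_wrong = multiclass_exp_loss(y, vector_code(1, 3), 3)    # margin < 0
```

Here `loss_correct` is below 1 and `loss_wrong` is above 1, so weak-learners that confuse the true class with another are penalized more than those that respond correctly.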

Similar articles

A Direct Approach to Multi-class Boosting and Extensions

Boosting methods combine a set of moderately accurate weak learners to form a highly accurate predictor. Despite the practical importance of multi-class boosting, it has received far less attention than its binary counterpart. In this work, we propose a fully-corrective multi-class boosting formulation which directly solves the multi-class problem without dividing it into multiple binary classi...

Effect of Pruning and Early Stopping on Performance of a Boosting Ensemble

Generating an architecture for an ensemble of boosting machines involves making a series of design decisions. One design decision is whether to use simple “weak learners” such as decision tree stumps or more complicated weak learners such as large decision trees or neural networks. Another design decision is the training algorithm for the constituent weak learners. Here we concentrate on binary...

A Simple Multi-Class Boosting Framework with Theoretical Guarantees and Empirical Proficiency

There is a need for simple yet accurate white-box learning systems that train quickly and with little data. To this end, we showcase REBEL, a multi-class boosting method, and present a novel family of weak learners called localized similarities. Our framework provably minimizes the training error of any dataset at an exponential rate. We carry out experiments on a variety of synthetic and real ...

Fast Training of Effective Multi-class Boosting Using Coordinate Descent Optimization

We present a novel column generation based boosting method for multi-class classification. Our multi-class boosting is formulated in a single optimization problem as in [1, 2]. Different from most existing multi-class boosting methods, which use the same set of weak learners for all the classes, we train class specified weak learners (i.e., each class has a different set of weak learners). We s...

Improved Multi-Class Cost-Sensitive Boosting via Estimation of the Minimum-Risk Class

We present a simple unified framework for multi-class cost-sensitive boosting. The minimum-risk class is estimated directly, rather than via an approximation of the posterior distribution. Our method jointly optimizes binary weak learners and their corresponding output vectors, requiring classes to share features at each iteration. By training in a cost-sensitive manner, weak le...


Journal:
  • Pattern Recognition

Volume 47, Issue 

Pages  -

Publication date: 2014